I came across this task a few days ago, so I am putting it here for future reference. The requirement is simple: the system needs to scan a specific folder and import every file in it into the database. These are the steps:
- Create an import format file using BCP, for use with BULK INSERT
- Use master..xp_cmdshell to get all the file names into a table variable
- Loop through the table and import the files one by one using OPENROWSET with the BULK option
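The three steps above can be sketched roughly like this (the folder C:\Import\ and the format file path are assumptions for illustration; the actual usp_import_common.sql differs in detail):

```sql
-- Assumed folder and format file paths, for illustration only
DECLARE @Files TABLE (FileName VARCHAR(255))
DECLARE @File VARCHAR(255), @SQL NVARCHAR(4000)

-- Step 2: list all CSV files in the folder via xp_cmdshell
INSERT INTO @Files (FileName)
EXEC master..xp_cmdshell 'dir /b C:\Import\*.csv'

-- xp_cmdshell output ends with a NULL row; discard it
DELETE FROM @Files WHERE FileName IS NULL

-- Step 3: loop through the file list and import each file
WHILE EXISTS (SELECT 1 FROM @Files)
BEGIN
    SELECT TOP 1 @File = FileName FROM @Files

    SET @SQL = N'INSERT Payment SELECT TMP.* FROM OPENROWSET(' +
               N' BULK ''C:\Import\' + @File + N''',' +
               N' FORMATFILE = ''C:\Payment.xml'',' +
               N' FIRSTROW = 2 ) TMP'
    EXEC sp_executesql @SQL

    DELETE FROM @Files WHERE FileName = @File
END
```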
Create Sample Payment Table
Run Create_Payment_Table.sql to create a "Payment" table. Its columns must match the columns in the CSV file.
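A minimal sketch of what the Payment table might look like, inferred from the column list that appears later in the comments (the exact data types here are assumptions):

```sql
-- Assumed column types; adjust to match your CSV data
CREATE TABLE Payment (
    [InvoiceNo]     VARCHAR(50),
    [Date]          VARCHAR(50),   -- imported as text; convert to DATETIME later if needed
    [Status]        VARCHAR(50),
    [CheckNumber]   VARCHAR(50),
    [VoucherNo]     VARCHAR(50),
    [PaymentAmount] MONEY,
    [FileName]      VARCHAR(255),  -- extra audit column (see comments)
    [BatchID]       INT            -- extra audit column (see comments)
)
```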
Creating an XML Format file
Run
EXEC master..xp_cmdshell 'bcp yourdbname.dbo.Payment format nul -T -t, -c -x -f C:\Payment.xml'
to create an XML format file for BULK INSERT usage.
March 24, 2008 at 4:04 am
I get an "access denied" error when I try running the above example. What do I need to do to correct this?
March 24, 2008 at 11:42 am
I'm having a lot of trouble running the bcp statement above. Any help would be great.
March 24, 2008 at 10:07 pm
Hi cfigy,
Regarding the bcp issues: can you please provide more info on the error? At which step do you get the error? Does bcp work on its own in the first place?
I would need more info in order to help with this. Thanks.
YS
March 25, 2008 at 9:06 am
My CSV files do not have a header row; how would I modify the above code for a CSV file without a header row? Also, I really don't want to add the additional information to each row. I tried removing it, but I encounter a "select and insert statements don't match" error.
March 25, 2008 at 11:34 pm
Hi,
You have 2 issues here:
1. To import a file without a header, simply change "FIRSTROW = 2" in usp_import_common.sql to "FIRSTROW = 1".
2. For the additional-information issue, you have to make sure the "INSERT Payment…" query in usp_import_common.sql doesn't include the additional columns. In this case, you should exclude "[FileName]" and "[BatchID]". The section should look something like this:
SET @SQL1 = '' +
'INSERT Payment' +
'(' +
' [InvoiceNo],' +
' [Date],' +
' [Status],' +
' [CheckNumber],' +
' [VoucherNo],' +
' [PaymentAmount]' +
') '
SET @SQL2 = '' +
'SELECT' +
' TMP.[InvoiceNo],' +
' TMP.[Date],' +
' TMP.[Status],' +
' TMP.[CheckNumber],' +
' TMP.[VoucherNo],' +
' TMP.[PaymentAmount] ' +
'FROM OPENROWSET(' +
' BULK ''' + @SourceFile + ''', ' +
' FORMATFILE = ''' + @FormatFile + ''', ' +
' FIRSTROW = 2 ) TMP'
Hope this helps solve your problem.
March 31, 2008 at 9:25 am
Thank you very much for the above code; it is very helpful. I have a question that you might be able to help me with. I'm importing several hundred files, and each row has a date to which I want to add a time. There are also a couple of other columns in the CSV files where I would like to add or remove parts of the data. Is there an easy way to change or manipulate the data in the above code while it is being imported? (Thanks again)
March 31, 2008 at 10:12 pm
You have to create the XML schema based on all the columns in your file, but you can then choose only the fields you require: just select "TMP.[Column]" according to your needs.
As for appending a time to a date field, I am not sure exactly what you want here. I assume you have a date column and want to append the current time to that field before saving it to the DB. I have a crude method that works; you might have a better solution. First, I assume you have a date-only column in a proper format (YourDateField). Then get the current date and time using the GETDATE() function and convert it to a VARCHAR in a time-only format. Lastly, concatenate that onto your date field, and you get your date column plus the current time. Here is a simple example:
DECLARE @YourDateField AS VARCHAR(50)
SET @YourDateField = '2008-03-31'
SELECT CAST(@YourDateField + ' ' + CONVERT(VARCHAR(50), GETDATE(), 14) AS DATETIME) AS DateTimeField
June 7, 2008 at 6:28 pm
I have SQL Server 2000, and it gives me the error "EXECUTE cannot be used as a source when inserting into a table variable."
How do I rectify that?
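SQL Server 2000 does not support INSERT ... EXEC into a table variable, only into a real table. The usual workaround is to capture the xp_cmdshell output in a temporary table instead; a sketch (the folder path is an assumption for illustration):

```sql
-- SQL Server 2000 workaround: use a #temp table, not a table variable,
-- as the target of INSERT ... EXEC.
CREATE TABLE #Files (FileName VARCHAR(255))

INSERT INTO #Files (FileName)
EXEC master..xp_cmdshell 'dir /b C:\Import\*.csv'

-- Discard the trailing NULL row that xp_cmdshell returns
DELETE FROM #Files WHERE FileName IS NULL

-- ... loop over #Files exactly as with the table variable ...

DROP TABLE #Files
```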